Wide interval for efficient self-scaling quasi-Newton algorithms

Authors

  • Mehiddin Al-Baali
  • Humaid Khalfan
Abstract

This paper uses certain conditions for the global and superlinear convergence of the two-parameter self-scaling Broyden family of quasi-Newton algorithms for unconstrained optimization to derive a wide interval for self-scaling updates. Numerical testing shows that such algorithms not only accelerate the convergence of the (unscaled) methods from the so-called convex class, but also increase their chances of success. Self-scaling updates from the preconvex and postconvex classes are shown to be effective in practice, and new algorithms that work well in practice, with or without scaling, are also obtained from the new interval. Unlike the behaviour of unscaled methods, numerical testing shows that varying the updating parameter within the proposed interval has little effect on the performance of the self-scaling algorithms.
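The two-parameter family referred to in the abstract combines an updating parameter (the Broyden parameter, here phi) with a scaling parameter (here tau) applied to the Hessian approximation before the update. As a rough illustration only, here is a minimal NumPy sketch of one such update; the function name, variable names, and the default Oren-Luenberger choice of tau are our assumptions, and the paper's wide interval for the parameters is not reproduced here.

```python
import numpy as np

def self_scaling_broyden_update(B, s, y, phi=0.0, tau=None):
    """One update of a two-parameter self-scaling Broyden family (sketch).

    Under our naming assumptions:
        B+ = tau * (B - B s s^T B / (s^T B s) + phi * (s^T B s) v v^T)
             + y y^T / (y^T s),
        v  = y / (y^T s) - B s / (s^T B s).
    phi = 0 gives scaled BFGS, phi = 1 scaled DFP, and tau = 1 recovers
    the unscaled Broyden family.
    """
    Bs = B @ s
    sBs = s @ Bs            # s^T B s, positive if B is positive definite
    ys = y @ s              # y^T s, positive under a Wolfe line search
    if tau is None:
        tau = ys / sBs      # classical Oren-Luenberger scaling (assumption)
    v = y / ys - Bs / sBs
    return tau * (B - np.outer(Bs, Bs) / sBs
                  + phi * sBs * np.outer(v, v)) + np.outer(y, y) / ys
```

Roughly speaking, phi in [0, 1] corresponds to the convex class, while values below 0 and above 1 give the preconvex and postconvex classes mentioned in the abstract; the admissible ranges of (tau, phi) studied in the paper are not captured by this sketch.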


Related articles

Self-Scaling Parallel Quasi-Newton Methods

In this paper, a new class of self-scaling quasi-Newton (SSQN) updates for solving unconstrained nonlinear optimization problems (UNOPs) is proposed. It is shown that many existing QN updates can be considered as special cases of the new family. Parallel SSQN algorithms based on this class of updates are studied. In comparison to standard serial QN methods, the proposed parallel SSQN (SSPQN) ...


Inexact-Hessian-Vector Products for Efficient Reduced-Space PDE-Constrained Optimization

We investigate reduced-space Newton-Krylov (NK) algorithms for engineering parameter optimization problems constrained by partial differential equations. We review reduced-space and full-space optimization algorithms, and we show that the efficiency of the reduced-space strategy can be improved significantly with inexact-Hessian-vector products computed using approximate second-order adjoints. R...


Self-Scaling Variable Metric Algorithms Without Line Search for Unconstrained Minimization

This paper introduces a new class of quasi-Newton algorithms for unconstrained minimization in which no line search is necessary and the inverse Hessian approximations are positive definite. These algorithms are based on a two-parameter family of rank-two updating formulae used earlier with line search in self-scaling variable metric algorithms. It is proved that, in a quadratic case, the new ...


The Global Convergence of Self-Scaling BFGS Algorithm with Nonmonotone Line Search for Unconstrained Nonconvex Optimization Problems

The self-scaling quasi-Newton method solves an unconstrained optimization problem by scaling the Hessian approximation matrix before it is updated at each iteration, to avoid possibly large eigenvalues in the Hessian approximations of the objective function. It has been proved in the literature that this method achieves global and superlinear convergence when the objective function is...
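The nonmonotone line search mentioned in this title relaxes the usual Armijo decrease condition by comparing against the worst of a few recent function values. Below is a minimal backtracking sketch of one common (Grippo-style) nonmonotone rule; the function name, constants, and history mechanism are illustrative assumptions, not the cited paper's exact conditions.

```python
def nonmonotone_armijo(f, x, d, g, f_history, c=1e-4, rho=0.5,
                       max_backtracks=30):
    """Backtracking line search with a nonmonotone Armijo rule (sketch).

    A step alpha is accepted when
        f(x + alpha * d) <= max(f_history) + c * alpha * g^T d,
    where f_history holds the last few objective values, so the objective
    may rise temporarily as long as it stays below the recent worst value.
    """
    f_ref = max(f_history)    # reference: worst of the recent iterates
    slope = c * (g @ d)       # g^T d < 0 for a descent direction d
    alpha = 1.0
    for _ in range(max_backtracks):
        if f(x + alpha * d) <= f_ref + alpha * slope:
            break
        alpha *= rho          # shrink the step and try again
    return alpha
```

With a history of length one this reduces to the ordinary (monotone) Armijo backtracking rule.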



Journal:
  • Optimization Methods and Software

Volume 20, Issue

Pages -

Year of publication: 2005